Markov Chain Decomposition for Convergence Rate Analysis

Authors

  • Neal Madras
  • Dana Randall
Abstract

In this paper we develop tools for analyzing the rate at which a reversible Markov chain converges to stationarity. Our techniques are useful when the Markov chain can be decomposed into pieces which are themselves easier to analyze. The main theorems relate the spectral gap of the original Markov chain to the spectral gaps of the pieces. In the first case the pieces are restrictions of the Markov chain to subsets of the state space; the second case treats a Metropolis-Hastings chain whose equilibrium distribution is a weighted average of the equilibrium distributions of other Metropolis-Hastings chains on the same state space.
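To make the quantity concrete, here is a minimal numerical sketch (not taken from the paper; the 4-state birth-death chain and the two-block partition are invented for illustration) that computes the spectral gap of a small reversible chain and of its restrictions to the blocks of a partition of the state space:

    import numpy as np

    def spectral_gap(P, pi):
        # Spectral gap 1 - lambda_2 of a reversible chain, computed after
        # symmetrizing P with respect to its stationary distribution pi.
        D = np.diag(np.sqrt(pi))
        Dinv = np.diag(1.0 / np.sqrt(pi))
        eigs = np.sort(np.linalg.eigvalsh(D @ P @ Dinv))
        return 1.0 - eigs[-2]

    def restriction(P, block):
        # Restriction of the chain to a block of states: transitions that
        # would leave the block are rejected (mass moved to the diagonal).
        Q = P[np.ix_(block, block)].copy()
        Q[np.diag_indices_from(Q)] += 1.0 - Q.sum(axis=1)
        return Q

    # Toy reversible (birth-death) chain on 4 states, purely illustrative.
    P = np.array([[0.5, 0.5, 0.0, 0.0],
                  [0.4, 0.5, 0.1, 0.0],
                  [0.0, 0.1, 0.5, 0.4],
                  [0.0, 0.0, 0.5, 0.5]])
    pi = np.array([2.0, 2.5, 2.5, 2.0]); pi /= pi.sum()   # satisfies detailed balance

    print("gap of full chain:", spectral_gap(P, pi))
    for block in ([0, 1], [2, 3]):
        pi_b = pi[block] / pi[block].sum()
        print("gap on block", block, ":", spectral_gap(restriction(P, block), pi_b))

Symmetrizing by sqrt(pi) is what makes the eigenvalues real here; it relies on reversibility, which both the original chain and its restrictions retain.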


Similar resources

Relative Entropy Rate between a Markov Chain and Its Corresponding Hidden Markov Chain

In this paper we study the relative entropy rate between a homogeneous Markov chain and a hidden Markov chain defined by observing the output of a discrete stochastic channel whose input is the finite-state, homogeneous, stationary Markov chain. For this purpose, we obtain the relative entropy between two finite subsequences of the above-mentioned chains with the help of the definition of...
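As a rough illustration of the finite-length quantity referred to above (the two-state chain and the binary symmetric channel below are hypothetical toy inputs, not from the paper), one can compute the relative entropy between the laws of length-n subsequences of the chain and of its channel output directly, and divide by n:

    import itertools
    import numpy as np

    # Hypothetical two-state stationary Markov chain and a symmetric channel.
    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])                     # transition matrix
    pi = np.array([2.0, 1.0]); pi /= pi.sum()      # its stationary distribution
    C = np.array([[0.95, 0.05],
                  [0.05, 0.95]])                   # channel: C[x, y] = Pr(Y = y | X = x)

    def chain_prob(seq):
        # Probability of the sequence under the Markov chain itself.
        p = pi[seq[0]]
        for a, b in zip(seq, seq[1:]):
            p *= P[a, b]
        return p

    def hmm_prob(seq):
        # Probability of the sequence at the channel output (hidden Markov
        # chain), computed with the standard forward recursion.
        alpha = pi * C[:, seq[0]]
        for y in seq[1:]:
            alpha = (alpha @ P) * C[:, y]
        return alpha.sum()

    def relative_entropy_rate(n):
        # (1/n) * D( law of X_1..X_n || law of Y_1..Y_n )
        d = 0.0
        for seq in itertools.product(range(2), repeat=n):
            p, q = chain_prob(seq), hmm_prob(seq)
            d += p * np.log(p / q)
        return d / n

    for n in (2, 4, 8):
        print(n, relative_entropy_rate(n))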


Decomposition Methods and Sampling Circuits in the Cartesian Lattice

Decomposition theorems are useful tools for bounding the convergence rates of Markov chains. The theorems relate the mixing rate of a Markov chain to smaller, derivative Markov chains, defined by a partition of the state space, and can be useful when standard, direct methods fail. Not only does this simplify the chain being analyzed, but it allows a hybrid approach whereby different techniques ...


Renewal theory and computable convergence rates for geometrically ergodic Markov chains

We give computable bounds on the rate of convergence of the transition probabilities to the stationary distribution for a certain class of geometrically ergodic Markov chains. Our results are different from earlier estimates of Meyn and Tweedie, and from estimates using coupling, although we start from essentially the same assumptions of a drift condition towards a “small set”. The estimates sh...
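For readers unfamiliar with the assumption, the following toy computation (the reflecting random walk, the drift function V(x) = 1.5^x, and the candidate small set {0, 1} are all invented for illustration, not taken from the paper) verifies a geometric drift condition PV <= lambda*V + b*1_C of the kind these bounds start from:

    import numpy as np

    # Hypothetical reflecting random walk on {0, ..., N}: step down with
    # probability 0.7, up with probability 0.3, reflected at the boundaries.
    N = 50
    P = np.zeros((N + 1, N + 1))
    for x in range(N + 1):
        P[x, max(x - 1, 0)] += 0.7
        P[x, min(x + 1, N)] += 0.3

    V = 1.5 ** np.arange(N + 1)                    # candidate drift function
    PV = P @ V                                     # (PV)(x) = E[V(X_{n+1}) | X_n = x]
    C = np.array([x <= 1 for x in range(N + 1)])   # candidate small set {0, 1}

    # Smallest lambda and b for which PV <= lambda * V + b * 1_C holds.
    lam = (PV[~C] / V[~C]).max()
    b = (PV[C] - lam * V[C]).max()
    print("drift holds with lambda =", lam, "(< 1:", lam < 1, ") and b =", b)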


Oja's algorithm for graph clustering and Markov spectral decomposition

Given a positive definite matrix M and an integer Nm ≥ 1, Oja’s subspace algorithm will provide convergent estimates of the first Nm eigenvalues of M along with the corresponding eigenvectors. It is a common approach to principal component analysis. This paper introduces a normalized stochastic-approximation implementation of Oja’s subspace algorithm, as well as new applications to the spectral...
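A bare-bones sketch of the plain (unnormalized) Oja subspace iteration described above, run on synthetic data; the dimensions, step size, and number of iterations are arbitrary choices, and the normalization step the paper introduces is only indicated as a comment:

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stream of samples x with E[x x^T] = M for a random PSD matrix M.
    d, n_m = 10, 2
    A = rng.normal(size=(d, d))
    M = A @ A.T / d
    L = np.linalg.cholesky(M)

    W = rng.normal(size=(d, n_m))      # running estimate of the top-n_m eigenvectors
    eta = 1e-3
    for _ in range(50000):
        x = L @ rng.standard_normal(d)             # sample with covariance M
        y = W.T @ x
        # Oja's subspace update: W <- W + eta * (x x^T W - W (W^T x)(x^T W))
        W += eta * (np.outer(x, y) - W @ np.outer(y, y))
        # (a normalized implementation would also re-orthonormalize W, e.g. by QR)

    # With W approximately orthonormal, W^T M W recovers the leading eigenvalues of M.
    print("Oja estimates  :", np.sort(np.linalg.eigvalsh(W.T @ M @ W))[::-1])
    print("top eigenvalues:", np.sort(np.linalg.eigvalsh(M))[::-1][:n_m])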


Efficient Gaussian Sampling for Solving Large-Scale Inverse Problems using MCMC Methods

The resolution of many large-scale inverse problems using MCMC methods requires a step of drawing samples from a high-dimensional Gaussian distribution. While direct Gaussian sampling techniques, such as those based on Cholesky factorization, incur excessive numerical complexity and memory requirements, sequential coordinate sampling methods exhibit a low rate of convergence. Based on the re...
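For reference, this is the kind of direct Cholesky-based sampler whose cost the paragraph contrasts with coordinate-wise sampling; the tridiagonal precision matrix and the problem size are placeholders, and the O(n^3) factorization with its O(n^2) storage is the step that becomes prohibitive at realistic scales:

    import numpy as np

    def gaussian_sampler_from_precision(mu, Q):
        # Sampler for N(mu, Q^{-1}) via the Cholesky factorization Q = L L^T:
        # if z ~ N(0, I) and L^T x = z, then cov(x) = (L L^T)^{-1} = Q^{-1}.
        L = np.linalg.cholesky(Q)          # O(n^3) time, O(n^2) memory, done once
        def draw(rng):
            z = rng.standard_normal(len(mu))
            return mu + np.linalg.solve(L.T, z)
        return draw

    rng = np.random.default_rng(0)
    n = 100
    # Hypothetical tridiagonal, diagonally dominant precision matrix.
    Q = (np.diag(np.full(n, 2.0))
         + np.diag(np.full(n - 1, -0.9), 1)
         + np.diag(np.full(n - 1, -0.9), -1))
    draw = gaussian_sampler_from_precision(np.zeros(n), Q)

    samples = np.array([draw(rng) for _ in range(5000)])
    # The empirical covariance approaches Q^{-1} as the number of draws grows.
    print(np.abs(np.cov(samples.T) - np.linalg.inv(Q)).max())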




Publication year: 2007